
    Autonomous supervision and optimization of product quality in a multi-stage manufacturing process based on self-adaptive prediction models.

    In modern manufacturing facilities, there are essentially two phases for assuring high production quality with low (or even zero) defects and waste in order to save costs. The first phase concerns the early recognition of potentially arising problems in product quality; the second concerns proper reactions upon the recognition of such problems. In this paper, we address a holistic approach for handling both issues consecutively within a predictive maintenance framework in an on-line production system. Thereby, we address multi-stage functionality based on (i) data-driven forecast models for (measurable) product quality criteria (QCs) at a later stage, which are established and executed through process values (and their time-series trends) recorded at an early stage of production (describing its progress), and (ii) process optimization cycles whose outputs are suggestions for proper reactions at an earlier stage in the case of forecasted downtrends or violations of allowed boundaries in product quality. The data-driven forecast models are established through a high-dimensional batch time-series modeling problem. Here, we employ a non-linear version of PLSR (partial least squares regression) by coupling PLS with generalized Takagi–Sugeno fuzzy systems (termed PLS-fuzzy). The models are able to self-adapt over time based on recursive parameter adaptation and rule evolution functionalities. Two concepts for increased flexibility during model updates are proposed: (i) a dynamic down-weighting strategy for older samples with an adaptive update of the forgetting factor (steering forgetting intensity), and (ii) an incremental update of the latent variable space spanned by the directions (loading vectors) obtained through PLS; the whole model update approach is termed SAFM-IF (self-adaptive forecast models with increased flexibility).
Process optimization is achieved through multi-objective optimization using evolutionary techniques, where the (trained and updated) forecast models serve as surrogate models to guide the optimization process to high-quality Pareto fronts (containing solution candidates). A new influence analysis between process values and QCs is suggested based on the PLS-fuzzy forecast models in order to reduce the dimensionality of the optimization space and thus to guarantee higher-quality solutions within a reasonable amount of time (enabling better usage in on-line mode). The methodologies have been comprehensively evaluated on real on-line process data from a (micro-fluidic) chip production system, where the early stage comprises the injection molding process and the later stage the bonding process. The results show remarkable performance in terms of low prediction errors of the PLS-fuzzy forecast models (mostly lower than those achieved by other model architectures), as well as in terms of Pareto fronts whose individuals (solutions) had fitness close to the optimal values of the three most important target QCs (used for supervision): flatness, void events, and RMSEs of the chips. Suggestions could thus be provided to experts/operators on how best to change process values and associated machining parameters in the injection molding process in order to achieve significantly higher product quality for the final chips at the end of the bonding process.
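    The abstract above mentions recursive parameter adaptation with an adaptive forgetting factor that down-weights older samples. As a minimal, hypothetical sketch (not the paper's SAFM-IF method), recursive least squares with exponential forgetting illustrates the core mechanism: each update discounts past information by a factor lambda; all names below are illustrative.

    ```python
    class ForgettingRLS:
        """Recursive least squares with exponential forgetting (illustrative sketch)."""

        def __init__(self, dim, lam=0.99, delta=1e3):
            self.lam = lam                      # forgetting factor: smaller = faster forgetting
            self.w = [0.0] * dim                # current parameter estimate
            # inverse covariance P initialized to delta * I (weak prior)
            self.P = [[delta if i == j else 0.0 for j in range(dim)] for i in range(dim)]

        def update(self, x, y):
            """Incorporate one sample (x, y); returns the a-priori prediction error."""
            n = len(x)
            Px = [sum(self.P[i][j] * x[j] for j in range(n)) for i in range(n)]
            denom = self.lam + sum(x[i] * Px[i] for i in range(n))
            k = [v / denom for v in Px]         # Kalman-style gain
            err = y - sum(self.w[i] * x[i] for i in range(n))
            self.w = [self.w[i] + k[i] * err for i in range(n)]
            # P <- (P - k x^T P) / lambda : older samples are progressively discounted
            xP = [sum(x[i] * self.P[i][j] for i in range(n)) for j in range(n)]
            self.P = [[(self.P[i][j] - k[i] * xP[j]) / self.lam for i2j in [j]
                       ] and (self.P[i][j] - k[i] * xP[j]) / self.lam
                      for j in range(n)] for i in range(n)]
            return err
    ```

    An adaptive variant would additionally adjust `lam` on-line based on the recent error signal, which is the kind of mechanism the abstract's "adaptive update of the forgetting factor" refers to.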

    On-line anomaly detection with advanced independent component analysis of multi-variate residual signals from causal relation networks.

    Anomaly detection in today's industrial environments is an ambitious challenge: possible faults/problems, which may turn into severe production waste, defects, or damage to system components, need to be detected at an early stage. Data-driven anomaly detection in multi-sensor networks relies on models which are extracted from multi-sensor measurements and which characterize the anomaly-free reference situation; significant deviations from these models therefore indicate potential anomalies. In this paper, we propose a new approach based on causal relation networks (CRNs), which represent the inner causes and effects between sensor channels (or sensor nodes) in the form of partial sub-relations, and evaluate its functionality and performance on two distinct production phases within a micro-fluidic chip manufacturing scenario. The partial relations are modeled by non-linear (fuzzy) regression models characterizing the (local) degree of influence of the single causes on the effects. An advanced analysis of the multi-variate residual signals obtained from the partial relations in the CRNs is conducted. It employs independent component analysis (ICA) to characterize hidden structures in the fused residuals through independent components (latent variables) obtained via the demixing matrix. A significant change in the energy content of the latent variables, detected through automated control limits, indicates an anomaly. Suppression of possible noise content in the residuals (to decrease the likelihood of false alarms) is achieved by performing the residual analysis solely on the dominant parts of the demixing matrix.
Our approach could detect anomalies that caused bad-quality chips (with the occurrence of malfunctions) with negligible delay, based on the process data recorded by multiple sensors in two production phases, injection molding and bonding, which are carried out independently, with completely different process parameter settings and on different machines (and can hence be seen as two distinct use cases). Our approach furthermore (i) produced lower false alarm rates than several related and well-known state-of-the-art methods for (unsupervised) anomaly detection, and (ii) required much lower parametrization effort (in fact, none at all). Both aspects are essential for the usability of an anomaly detection approach.
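    The abstract describes flagging anomalies when the energy of residual signals exceeds automated control limits. As a simplified, hypothetical sketch (omitting the CRN and ICA machinery entirely), k-sigma control limits can be estimated from an anomaly-free reference phase and applied to new residuals; function names are illustrative.

    ```python
    import statistics

    def control_limits(reference_residuals, k=3.0):
        """k-sigma control band estimated from anomaly-free reference residuals."""
        mu = statistics.mean(reference_residuals)
        sd = statistics.stdev(reference_residuals)
        return mu - k * sd, mu + k * sd

    def flag_anomalies(residuals, lo, hi):
        """Return indices of residuals falling outside the control band."""
        return [i for i, r in enumerate(residuals) if r < lo or r > hi]
    ```

    In the paper's setting, the quantity monitored would be the energy content of ICA latent variables rather than raw residuals, but the control-limit logic is of this general form.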

    WHO/IUIS Allergen Nomenclature: Providing a common language

    A systematic nomenclature for allergens originated in the early 1980s, when few protein allergens had been described. A group of scientists led by Dr. David G. Marsh developed a nomenclature based on the Linnaean taxonomy, and further established the World Health Organization/International Union of Immunological Societies (WHO/IUIS) Allergen Nomenclature Sub-Committee in 1986. Its stated aim was to standardize the names given to the antigens (allergens) that caused IgE-mediated allergies in humans. The Sub-Committee first published a revised list of allergen names in 1986, which continued to grow, with occasional publications, until 1994. Between 1994 and 2007 the database was maintained online as a text table, and was then converted to a more readily updated website. The allergen list became the Allergen Nomenclature database (www.allergen.org), which currently includes approximately 880 proteins from a wide variety of sources. The Sub-Committee includes experts on clinical and molecular allergology, who review submissions of allergen candidates using evidence-based criteria developed by the Sub-Committee. The review process assesses the biochemical analysis and the proof of allergenicity submitted, and aims to assign allergen names prior to publication. The Sub-Committee maintains and revises the database, and addresses continuous challenges as new “omics” technologies provide increasing data about potential new allergens. Most journals publishing information on new allergens require an official allergen name, which involves submission of confidential data to the WHO/IUIS Allergen Nomenclature Sub-Committee sufficient to demonstrate binding of IgE from allergic subjects to the purified protein.

    Current (Food) allergenic risk assessment: is it fit for novel foods? status quo and identification of gaps

    Food allergies are recognized as a global health concern. In order to protect allergic consumers from severe symptoms, allergenic risk assessment is in place for well-known foods and foods containing genetically modified ingredients. However, the population is steadily growing and there is a rising need to provide adequate protein-based foods, including novel sources not yet used for human consumption. In this context, safety issues such as a potentially increased allergenic risk need to be assessed before marketing novel food sources. Therefore, the established allergenic risk assessment for genetically modified organisms needs to be re-evaluated for its applicability to risk assessment of novel food proteins. Two different scenarios of allergic sensitization have to be assessed. The first scenario is the presence of already known allergenic structures in novel foods; here, a comparative assessment can be performed and the range of cross-reactivity can be explored. In the second scenario, allergic reactions are observed toward as yet unknown allergenic structures, and no reference material is available. This review summarizes the current analytical methods for allergenic risk assessment, highlighting the strengths and limitations of each method and discussing the gaps in this assessment that need to be addressed in the near future.
    Funding: Austrian Science Fund [FWF SFB F4603]; Ministry of Education, Science and Technological Development of the Republic of Serbia [OI172024]; MINECO, Spain [AGL2014-59771-R]; PROMAR: Projetos Pilotos e a Transformação de Embarcações de Pesca [31-03-05-FEP-0060].

    Challenges for Allergy Diagnosis in Regions with Complex Pollen Exposures

    Over the past few decades, significant scientific progress has influenced clinical allergy practice. The biological standardization of extracts was followed by the massive identification and characterization of new allergens and their progressive use as diagnostic tools, including allergen microarrays that facilitate the simultaneous testing of more than 100 allergen components. Specific diagnosis is the basis of allergy practice and always aims to select the best therapeutic or avoidance intervention. As a consequence, redundant or irrelevant information might add unnecessary cost and complexity to daily clinical practice. A rational use of the different diagnostic alternatives would allow a significant improvement in the diagnosis and treatment of allergic patients, especially those residing in areas with complex pollen exposures.

    Allergic sensitization: screening methods

    Experimental in silico, in vitro, and rodent models for screening and predicting protein sensitizing potential are discussed, including whether there is evidence of new sensitizations and allergies since the introduction of genetically modified crops in 1996, the importance of linear versus conformational epitopes, and the protein families that become allergens. Some common challenges for predicting protein sensitization are addressed: (a) exposure routes; (b) frequency and dose of exposure; (c) dose-response relationships; (d) the role of digestion, food processing, and the food matrix; (e) the role of infection; (f) the role of the gut microbiota; (g) the influence of the structure and physicochemical properties of the protein; and (h) the genetic background and physiology of consumers. The consensus view is that sensitization screening models are not yet validated to definitively predict the de novo sensitizing potential of a novel protein. However, they would be extremely useful in the discovery and research phases of understanding the mechanisms of food allergy development, and may prove fruitful in providing information for the allergenicity risk assessment of future products on a case-by-case basis. These data and findings were presented at a 2012 international symposium in Prague organized by the Protein Allergenicity Technical Committee of the International Life Sciences Institute’s Health and Environmental Sciences Institute.

    Current challenges facing the assessment of the allergenic capacity of food allergens in animal models

    Food allergy is a major health problem of increasing concern. The insufficiency of protein sources for human nutrition in a world with a growing population is also a significant problem. The introduction of new protein sources into the diet, such as newly developed innovative foods or foods produced using new technologies and production processes, insects, algae, duckweed, or agricultural products from third countries, creates the opportunity for the development of new food allergies, and this in turn has driven the need to develop test methods capable of characterizing the allergenic potential of novel food proteins. There is no doubt that robust and reliable animal models for the identification and characterization of food allergens would be valuable tools for safety assessment. However, although various animal models have been proposed for this purpose, to date none have been formally validated as predictive, and none are currently suitable for testing the allergenic potential of new foods. Here, the designs of various animal models are reviewed, including, among other considerations, species and strain, diet, route of administration, dose and formulation of the test protein, relevant controls, and endpoints measured.

    Abstracts from the Food Allergy and Anaphylaxis Meeting 2016


    A soft-computing framework for automated optimization of multiple product quality criteria with application to micro-fluidic chip production.

    We describe a general strategy for optimizing the quality of products of industrial batch processes that comprise multiple production stages, focusing on the particularities of applying this strategy in the field of micro-fluidic chip production. Our approach is based on three interacting components: (i) a new hybrid design of experiments (DoE) strategy that combines expert- and distribution-based space exploration with model-based uncertainty criteria to obtain a representative set of initial samples (i.e., settings of essential machining process parameters); (ii) construction of linear and non-linear predictive mappings from these samples to describe the relation between machining process parameters and resulting quality control (QC) values; and (iii) incorporation of these mappings as surrogate fitness estimators into a multi-objective optimization process to discover settings that outperform those routinely used by operators. These optimized settings lead to final products with better quality and/or higher functionality for clients. The optimization module employs a co-evolutionary strategy we developed that is able to deliver better Pareto non-dominated solutions than the renowned NSGA-II multi-objective solver. We applied the proposed high-level surrogate-based multi-objective strategy both in a single/late-stage optimization scenario and in a more challenging multi-stage scenario, yielding final optimization results that improved parameter settings and thus product quality compared to standard expert-based production process parameterizations.
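    The optimization module above is judged by the quality of its Pareto non-dominated solutions. As a minimal illustration (not the paper's co-evolutionary solver), the non-dominated filtering step for a minimization problem can be sketched as follows; function names are illustrative.

    ```python
    def dominates(a, b):
        """a dominates b (minimization): no worse in every objective, strictly better in one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Keep only candidate solutions not dominated by any other candidate."""
        return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
    ```

    Solvers such as NSGA-II apply this non-dominance relation repeatedly (with additional diversity mechanisms) to evolve the population toward a well-spread Pareto front.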